The Law of the Iterated Logarithm for the Integrated Squared Deviation of a Kernel Density Estimator

Author

  • David M. Mason
Abstract

Let $f_{n,K}$ denote a kernel estimator of a density $f$ on $\mathbb{R}$ such that $\int_{\mathbb{R}} f^p(x)\,dx < \infty$ for some $p > 2$. It is shown, under quite general conditions on the kernel $K$ and on the window sizes, that the centered integrated squared deviation of $f_{n,K}$ from its mean,
$$\|f_{n,K} - E f_{n,K}\|_2^2 - E\|f_{n,K} - E f_{n,K}\|_2^2,$$
satisfies a law of the iterated logarithm (LIL). This is then used to obtain an LIL for the deviation from the true density,
$$\|f_{n,K} - f\|_2^2 - E\|f_{n,K} - f\|_2^2.$$
The main tools are the Komlós-Major-Tusnády approximation, a moderate deviation result for triangular arrays of weighted chi-square variables adapted from Pinsky (1966), and an exponential inequality of Giné, Latała and Zinn (2000) for degenerate U-statistics, applied in combination with decoupling and maximal inequalities.

Running head: The LIL for $L_2$ functionals of kernel density estimators.

1. Introduction. As far as we know, there are no laws of the iterated logarithm for the integrated $p$-th absolute deviation of a kernel density estimator from its mean, although central limit theorems do exist. This anomaly seems to be due to the fact that there are serious difficulties both in finding the proper way of blocking and in deriving sufficiently precise moderate deviation results. In this paper we show how these difficulties can be handled in the case $p = 2$. In the process we spotlight a number of techniques that should be of independent interest. Unfortunately, our methods do not extend to other values of $p$. In order to make our aim clear, let us now fix some notation and introduce our basic assumptions. Throughout this paper we assume that $f$ is a probability density on the real line $\mathbb{R}$ such that
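For concreteness, the estimator and the statistic in question can be written out as follows. The display below uses the standard kernel-estimator form with bandwidth (window size) $h_n$; it is a reconstruction from the usual conventions rather than a quotation from the paper, whose precise assumptions on $K$ and $h_n$ appear only in the full text.

\[
f_{n,K}(x) \;=\; \frac{1}{n h_n} \sum_{i=1}^{n} K\!\left(\frac{x - X_i}{h_n}\right), \qquad x \in \mathbb{R},
\]
where $X_1,\dots,X_n$ are i.i.d. observations with density $f$, and the statistic studied is
\[
T_n \;=\; \big\|f_{n,K} - E f_{n,K}\big\|_2^2 \;-\; E\big\|f_{n,K} - E f_{n,K}\big\|_2^2,
\qquad \|g\|_2^2 = \int_{\mathbb{R}} g^2(x)\,dx .
\]

The LIL asserted in the abstract then states that, for a suitable deterministic normalizing sequence depending on $n$ and $h_n$ (specified in the paper), the normalized $|T_n|$ has an almost surely finite, positive limsup.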


Similar articles

Gamma Kernel Estimators for Density and Hazard Rate of Right-Censored Data

Nonparametric estimation of the density and hazard rate functions for right-censored data using kernel smoothing techniques is considered. The "classical" fixed symmetric kernel-type estimator of these functions performs well in the interior region, but it suffers from bias in the boundary region. Here, we propose new estimators based on gamma kernels for the density...


The Relative Improvement of Bias Reduction in Density Estimator Using Geometric Extrapolated Kernel

One of the nonparametric procedures used to estimate densities is the kernel method. In this paper, in order to reduce the bias of kernel density estimation, methods such as the usual kernel (UK), geometric extrapolation usual kernel (GEUK), bias reduction kernel (BRK) and geometric extrapolation bias reduction kernel (GEBRK) are introduced. Theoretical properties, including the selection of smoothness para...


Asymptotic Behaviors of the Lorenz Curve for Left Truncated and Dependent Data

The purpose of this paper is to provide some asymptotic results for the nonparametric estimator of the Lorenz curve and Lorenz process in the case where the data are assumed to be strong mixing and subject to random left truncation. First, we show that the nonparametric estimator of the Lorenz curve is uniformly strongly consistent for the associated Lorenz curve. Also, a strong Gaussian approximation for ...


Boundary Kernels for Distribution Function Estimation

Boundary effects for kernel estimators of curves with compact supports are well known in regression and density estimation frameworks. In this paper we address the use of boundary kernels for distribution function estimation. We establish the Chung-Smirnov law of the iterated logarithm and an asymptotic expansion for the mean integrated squared error of the proposed estimator. These results show t...


Asymptotic Behaviors of Nearest Neighbor Kernel Density Estimator in Left-truncated Data

Kernel density estimators are the basic tools for density estimation in non-parametric statistics. The k-nearest neighbor kernel estimators represent a special form of kernel density estimators, in which the bandwidth varies depending on the location of the sample points. In this paper, we initially introduce the k-nearest neighbor kernel density estimator in the random left-truncatio...
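As background for the entry above, the nearest-neighbor idea is usually formalized as follows; this is the standard k-nearest-neighbor kernel form, given here only as an illustration and not quoted from the listed paper. With $R_k(x)$ denoting the distance from $x$ to its $k$-th nearest sample point,
\[
\hat f_{n,k}(x) \;=\; \frac{1}{n\,R_k(x)} \sum_{i=1}^{n} K\!\left(\frac{x - X_i}{R_k(x)}\right),
\]
so the effective bandwidth $R_k(x)$ is wide where observations are sparse and narrow where they are dense.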



Journal title:

Volume   Issue

Pages  -

Publication year: 2004